Bayesian Regularisation and Pruning using a Laplace Prior
Author
Abstract
Standard techniques for improved generalisation from neural networks include weight decay and pruning. Weight decay has a Bayesian interpretation, with the decay function corresponding to a prior over weights. The method of transformation groups and maximum entropy indicates a Laplace rather than a Gaussian prior. After training, the weights then arrange themselves into two classes: (1) those with a common sensitivity to the data error, and (2) those failing to achieve this sensitivity, which therefore vanish. Since the critical value is determined adaptively during training, pruning (in the sense of setting weights to exact zeros) becomes a consequence of regularisation alone. The count of free parameters is also reduced automatically as weights are pruned. A comparison is made with the results of MacKay using the evidence framework and a Gaussian regulariser.
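As a point of clarification for the two weight classes mentioned in the abstract, the following is a minimal sketch of the stationarity conditions implied by an L1 (Laplace) penalty; the notation M(w), E_D and α is assumed here for illustration and is not taken verbatim from the paper.

\[
M(\mathbf{w}) \;=\; E_D(\mathbf{w}) \;+\; \alpha \sum_i |w_i|,
\qquad
p(w_i) \;\propto\; \exp\!\left(-\alpha\,|w_i|\right).
\]

At any minimum of \(M\), each weight satisfies one of two conditions:

\[
\left|\frac{\partial E_D}{\partial w_i}\right| = \alpha \quad \text{if } w_i \neq 0,
\qquad
\left|\frac{\partial E_D}{\partial w_i}\right| \le \alpha \quad \text{if } w_i = 0,
\]

so the "common sensitivity" is the regularisation scale \(\alpha\), and any weight whose sensitivity to the data error fails to reach \(\alpha\) is set to exactly zero, i.e. pruned. In the abstract's terms, this critical value is determined adaptively during training rather than fixed in advance.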
Similar resources
Bayesian Logistic Regression Model Choice via Laplace-Metropolis Algorithm
Following a Bayesian statistical inference paradigm, we provide an alternative methodology for analyzing a multivariate logistic regression. We use a multivariate normal prior in the Bayesian analysis. We present a unique Bayes estimator associated with a prior which is admissible. The Bayes estimators of the coefficients of the model are obtained via MCMC methods. The proposed procedure...
Sparse Multinomial Logistic Regression via Bayesian L1 Regularisation
Multinomial logistic regression provides the standard penalised maximum-likelihood solution to multi-class pattern recognition problems. More recently, the development of sparse multinomial logistic regression models has found application in text processing and microarray classification, where explicit identification of the most informative features is of value. In this paper, we propose a spars...
Probabilistic non-linear registration with spatially adaptive regularisation
This paper introduces a novel method for inferring spatially varying regularisation in non-linear registration. This is achieved through full Bayesian inference on a probabilistic registration model, where the prior on the transformation parameters is parameterised as a weighted mixture of spatially localised components. Such an approach has the advantage of allowing the registration to be more...
Gene Selection in Cancer Classification using Sparse Logistic Regression with Bayesian Regularisation
Motivation: Gene selection algorithms for cancer classification, based on the expression of a small number of biomarker genes, have been the subject of considerable research in recent years. Shevade and Keerthi (2003) propose a gene selection algorithm based on sparse logistic regression (SLogReg) incorporating a Laplace prior to promote sparsity in the model parameters, and provide a simple bu...
Functional Brain Response to Emotional Musical Stimuli in Depression, Using INLA Approach for Approximate Bayesian Inference
Introduction: One of the vital skills which has an impact on emotional health and well-being is the regulation of emotions. In recent years, the neural basis of this process has been considered widely. One of the powerful tools for eliciting and regulating emotion is music. The Anterior Cingulate Cortex (ACC) is part of the emotional neural circuitry involved in Major Depressive Disorder (MDD)....
Journal title:
Volume / Issue:
Pages: -
Publication year: 1994